Salary: ₹55-₹80 Lakhs per Annum (Expected)
Description:
Morgan Stanley is looking for a Data Warehousing Engineer (Director, Data & Analytics Engineering) to join its growing Cybersecurity and Technology Risk team in Bengaluru.
About the Organization:
The Cybersecurity division at Morgan Stanley is focused on defending against advanced cyber threats and maintaining resilience across systems and data. Within this division, the Technology Risk Governance & Controls (TRGC) team oversees governance and ensures that technology risks are identified, mitigated, and managed.
Role Overview:
As a Data Warehousing Engineer at the Director level, you'll be responsible for building scalable data pipelines, managing data transformations, ensuring high data quality, and enabling analytics and reporting for business-critical insights. You'll work across modern data platforms and contribute to designing systems that support decision-making and risk governance at scale.
Key Responsibilities:
- Design and develop scalable, high-performance data pipelines.
- Transform and model data for analytics, reporting, and governance needs.
- Ensure data integrity, accuracy, and availability across systems.
- Collaborate with teams to implement Snowflake-based solutions, optimize performance, and manage access controls.
- Monitor pipeline health, troubleshoot issues, and enforce best practices in data quality.
- Utilize workflow orchestration tools like Airflow or Dagster for automation.
- Implement and optimize ETL processes using dbt and Python.
- Support the design and implementation of REST APIs for data services.
- Collaborate with cross-functional teams in a fast-paced, Agile environment.
Key Technical Skills:
Python, Apache Spark (PySpark), Snowflake, NoSQL, Graph Database, dbt, Airflow/Dagster, SQL, Data Modeling, Git, CI/CD, REST APIs
Requirements:
- 4+ years of strong hands-on experience as a Data Engineer (Director-level candidates will typically bring 10+ years of overall experience).
- Proficiency in Python for scripting, ETL, and data manipulation.
- Expertise with Apache Spark or equivalent big data frameworks.
- Solid experience with Snowflake, including schema design, optimization, and warehouse management.
- Strong SQL and data modeling knowledge.
- Hands-on experience with dbt for transformations and data orchestration.
- Familiarity with Airflow or similar workflow tools.
- Strong communication, collaboration, and problem-solving skills.
- Experience working with version control and CI/CD practices.
Important Notice:
This job description and related content are owned by Morgan Stanley. We are only sharing this information to help job seekers find opportunities. For application procedures, status, or any related concerns, please contact Morgan Stanley directly. We do not process applications or respond to candidate queries.